Search for: All records

Creators/Authors contains: "Grote, Dustin"

  1. Abstract: Self-report assessments are used frequently in higher education to assess a variety of constructs, including attitudes, opinions, knowledge, and competence. Systems thinking is an example of one competence often measured using self-report assessments, in which individuals answer several questions about their perceptions of their own skills, habits, or daily decisions. In this study, we define systems thinking as the ability to see the world as a complex interconnected system where different parts can influence each other, and the interrelationships determine system outcomes. An alternative, less common assessment approach is to measure skills directly by providing a scenario about an unstructured problem and evaluating respondents' judgment or analysis of the scenario (scenario-based assessment). This study explored the relationships between engineering students' performance on self-report assessments and scenario-based assessments of systems thinking, finding no significant relationships between the two assessment techniques. These results suggest that there may be limitations to using self-report assessments as a method to assess systems thinking and other competencies in educational research and evaluation, which could be addressed by incorporating alternative formats for assessing competence. Future work should explore these findings further and support the development of alternative assessment approaches.
  2. Abstract:
     Background: Despite many initiatives to improve graduate student and faculty diversity in engineering, there has been little or no change in the percentage of people from racially minoritized backgrounds in either of these groups.
     Purpose/Hypothesis: The purpose of this paper is to counter the scarcity fallacy, in which institutions blame the "shortage" of qualified people from traditionally marginalized backgrounds for their own lack of representation, related to prospective PhD students and prospective faculty from traditionally marginalized groups. This study identifies the BS-to-PhD and PhD-to-tenure-track-faculty institutional pathways of Black/African American and Hispanic/Latino engineering doctorate recipients.
     Design/Method: Using the US Survey of Earned Doctorates, we tracked the BS-to-PhD institutional pathways of 3952 Black/African American and 5732 Hispanic/Latino engineering PhD graduates. We also used the Survey of Doctorate Recipients to track the PhD-to-tenure-track faculty pathways of 104 Black/African American and 211 Hispanic/Latino faculty.
     Results: The majority of Black/African American and Hispanic/Latino PhD graduates in this study did not earn their BS degrees from Top 25 institutions, but rather from Not Top 25, non-US, and minority-serving institutions. The results also show the relatively small proportion of PhD earners and faculty members who move into highly ranked institutions after earning a bachelor's degree from outside this set of institutions.
     Conclusions: The findings of this study have important implications for graduate student and faculty recruitment by illustrating that recruitment from a narrow range of institutions (i.e., Top 25 institutions) is unlikely to result in increased diversity among racially minoritized PhDs and faculty in engineering.
  3. Abstract: We introduce the Lake Urmia Vignette (LUV) as a tool to assess individuals' understanding of complexity in socio-environmental systems. LUV is based on a real-world case and includes a short vignette describing an environmental catastrophe involving a lake. Over a few decades, significant issues have manifested at the lake because of various social, political, economic, and environmental factors. We design a rubric for assessing responses to a prompt, and we conduct a pilot test with a sample of 30 engineering graduate students. We compare responses to LUV with other measures. Our findings suggest that students' understanding of complexity is positively associated with their understanding of systems concepts such as feedback loops, but not with other possible variables such as self-reported systems thinking skills or systems-related coursework. Based on the provided instructions, researchers can use LUV as a novel assessment tool to examine understanding of complexity in socio-environmental systems. © 2020 System Dynamics Society